
@konflux-internal-p02 konflux-internal-p02 bot commented Oct 14, 2025

This PR contains the following updates:

| Package | Change |
| --- | --- |
| accelerate | `==1.10.0` -> `==1.11.0` |

**Warning**: Some dependencies could not be looked up. Check the warning logs for more information.


Release Notes

huggingface/accelerate (accelerate)

v1.11.0: TE MXFP8, FP16/BF16 with MPS, Python 3.10

Compare Source

TE MXFP8 support

We've added support for MXFP8 in our TransformerEngine integration. To use it, set `use_mxfp8_block_scaling` in `fp8_config`. See the [NVIDIA docs](https://docs.nvidia.com/deeplearning/transformer-engine/user-guide/examples/fp8_primer.html#MXFP8-and-block-scaling).
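As a rough sketch of where that option lives (only `use_mxfp8_block_scaling` is named in these release notes; the surrounding keys are illustrative assumptions based on accelerate's FP8 config layout, not verified against this release):

```yaml
# Hypothetical excerpt from an accelerate config file.
# Only `use_mxfp8_block_scaling` comes from the release notes above;
# the other keys are illustrative assumptions.
mixed_precision: fp8
fp8_config:
  backend: TE                    # TransformerEngine backend
  use_mxfp8_block_scaling: true  # enable MXFP8 block scaling
```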

FP16/BF16 Training for MPS devices

BF16 and FP16 support for MPS devices is finally here. You can now pass `mixed_precision="fp16"` or `"bf16"` when training on a Mac (fp16 requires torch 2.8; bf16 requires torch 2.6).
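In an accelerate config file this corresponds to the `mixed_precision` field. A minimal sketch (the surrounding keys are the usual ones emitted by `accelerate config`, shown here as an assumption for a single-machine setup):

```yaml
# Minimal excerpt: request fp16 autocast on an Apple Silicon (MPS) machine.
# Per the notes above, fp16 needs torch >= 2.8 and bf16 needs torch >= 2.6.
compute_environment: LOCAL_MACHINE
distributed_type: NO
mixed_precision: fp16  # or: bf16
```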

FSDP updates

The following PRs add support for `ignored_params` and `no_sync()`, respectively, for FSDPv2:

Mixed precision can now be passed as a dtype string via the `accelerate` CLI flag or `fsdp_config` in the accelerate config file:
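A sketch of what that might look like in the config file (the exact key name and placement inside `fsdp_config` are assumptions; the notes only say a dtype string is now accepted there):

```yaml
# Illustrative excerpt: FSDP mixed precision given as a plain dtype string.
distributed_type: FSDP
fsdp_config:
  mixed_precision: bf16  # dtype string, per the release notes
```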

Nd-parallel updates

Some minor updates concerning nd-parallelism.

Bump to Python 3.10

We've dropped support for Python 3.9, as it reached end-of-life in October.

Lots of minor fixes:
New Contributors

Full Changelog: huggingface/accelerate@v1.10.1...v1.11.0

v1.10.1: Patchfix

Compare Source

Full Changelog: huggingface/accelerate@v1.10.0...v1.10.1


Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


  • If you want to rebase/retry this PR, check this box

To execute skipped test pipelines, write the comment `/ok-to-test`.

This PR has been generated by MintMaker (powered by Renovate Bot).

@konflux-internal-p02 konflux-internal-p02 bot force-pushed the konflux/mintmaker/rhoai-3.0/accelerate-1.x branch from 94489a2 to eb8adc0 Compare October 17, 2025 20:19
Signed-off-by: konflux-internal-p02 <170854209+konflux-internal-p02[bot]@users.noreply.github.com>
@konflux-internal-p02 konflux-internal-p02 bot force-pushed the konflux/mintmaker/rhoai-3.0/accelerate-1.x branch from eb8adc0 to ef81d21 Compare October 21, 2025 00:25
@konflux-internal-p02 konflux-internal-p02 bot changed the title Update dependency accelerate to v1.10.1 Update dependency accelerate to v1.11.0 Oct 21, 2025